ICA Using Kernel Entropy Estimation with NlogN Complexity
Authors
Abstract
Mutual information (MI) is a common criterion in independent component analysis (ICA) optimization. MI is derived from probability density functions (PDFs). In some scenarios, assuming a parametric form for the PDF leads to poor performance, so the need arises for nonparametric PDF and MI estimation. Existing nonparametric algorithms suffer from high complexity, particularly in high dimensions. To counter this obstacle, we present an ICA algorithm based on accelerated kernel entropy estimation. It achieves both high separation performance and low computational complexity. For K sources with N samples, our ICA algorithm has an iteration complexity of at most O(KN log N + KN).
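The N log N flavour of speed-up the abstract describes can be illustrated by a binned kernel density estimate: binning the N samples onto a regular grid costs O(N), and smoothing the B-bin histogram with a Gaussian kernel via FFT convolution costs O(B log B). The sketch below is illustrative only, with assumed parameters (`bandwidth`, `n_bins`); it is not the paper's exact algorithm.

```python
import numpy as np

def kernel_entropy(x, bandwidth=0.1, n_bins=1024):
    """Plug-in entropy estimate from a binned Gaussian KDE.

    Binning N samples costs O(N); smoothing the B-bin histogram by
    FFT convolution costs O(B log B) -- the N log N idea in spirit.
    Illustrative sketch, not the paper's algorithm.
    """
    lo = x.min() - 3 * bandwidth
    hi = x.max() + 3 * bandwidth
    grid = np.linspace(lo, hi, n_bins)
    dx = grid[1] - grid[0]
    counts, _ = np.histogram(x, bins=n_bins, range=(lo, hi))
    # Gaussian kernel sampled on the grid, normalised to sum to 1.
    kern = np.exp(-0.5 * ((grid - grid[n_bins // 2]) / bandwidth) ** 2)
    kern /= kern.sum()
    # Circular FFT convolution; rolling the kernel puts its peak at
    # index 0 so the convolution is effectively centred.
    dens = np.fft.irfft(
        np.fft.rfft(counts) * np.fft.rfft(np.roll(kern, -(n_bins // 2))),
        n_bins,
    )
    dens = np.maximum(dens, 1e-300) / (len(x) * dx)  # clamp FFT noise, normalise
    # Riemann-sum plug-in estimate of -integral p log p.
    return -np.sum(dens * np.log(dens)) * dx
```

For a standard Gaussian sample the estimate should approach the true differential entropy 0.5·log(2πe) ≈ 1.419.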
Similar resources
An Assessment of Hermite Function Based Approximations of Mutual Information Applied to Independent Component Analysis
At the heart of many ICA techniques is a nonparametric estimate of an information measure, usually via nonparametric density estimation, for example, kernel density estimation. While not as popular as kernel density estimators, orthogonal functions can be used for nonparametric density estimation (via a truncated series expansion whose coefficients are calculated from the observed data). While ...
Performance comparison of new nonparametric independent component analysis algorithm for different entropic indexes
Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only measure in the literature. In this paper, instead of Shannon entropy, Tsallis entropy is used and a novel ICA algorithm, which uses kernel density estimation (KDE) for estimation of source distributions, is proposed. KDE is di...
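The Tsallis entropy mentioned in this snippet generalizes Shannon entropy through an index q and recovers it in the limit q → 1. A minimal sketch of the discrete form (a hypothetical helper, not the cited paper's code):

```python
import numpy as np

def tsallis_entropy(p, q=2.0):
    """Discrete Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1).

    The Shannon entropy -sum_i p_i log p_i is recovered as q -> 1.
    Hypothetical illustration, not the cited paper's implementation.
    """
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                            # normalise to a distribution
    if np.isclose(q, 1.0):
        nz = p[p > 0]
        return -np.sum(nz * np.log(nz))        # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)
```

For the uniform distribution over 4 outcomes, S_2 = (1 − 4·(1/4)²)/1 = 0.75, while the Shannon value is log 4.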
Harmonic Source Localization Approach Based on Fast Kernel Entropy Optimization ICA and Minimum Conditional Entropy
Abstract: Based on the fast kernel entropy optimization independent component analysis and the minimum conditional entropy, this paper proposes a harmonic source localization method which aims at accurately estimating harmonic currents and identifying harmonic sources. The injected harmonic currents are estimated by the fast kernel entropy optimization independent component analysis (FKEO-ICA) ...
Kernel Principal Components Are Maximum Entropy Projections
Principal Component Analysis (PCA) is a very well known statistical tool. Kernel PCA is a nonlinear extension to PCA based on the kernel paradigm. In this paper we characterize the projections found by Kernel PCA from a information theoretic perspective. We prove that Kernel PCA provides optimum entropy projections in the input space when the Gaussian kernel is used for the mapping and a sample...
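The setting of this result, kernel PCA with a Gaussian kernel, can be sketched in a few lines of numpy: form the RBF Gram matrix, double-centre it (centring in feature space), and project onto its leading eigenvectors. This is a generic textbook sketch with an assumed kernel width `gamma`, not the paper's derivation.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA with a Gaussian (RBF) kernel.

    Eigendecomposes the double-centred Gram matrix and returns the
    projections of the training points onto the leading components.
    Generic sketch; `gamma` is an assumed kernel-width parameter.
    """
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = X.shape[0]
    J = np.eye(n) - np.full((n, n), 1.0 / n)
    Kc = J @ K @ J                             # centre in feature space
    vals, vecs = np.linalg.eigh(Kc)            # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, top] / np.sqrt(np.maximum(vals[top], 1e-12))
    return Kc @ alphas                         # training-point projections
```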
ICA Using Spacings Estimates of Entropy
This paper presents a new algorithm for the independent components analysis (ICA) problem based on efficient spacings estimates of entropy. Like many previous methods, we minimize a standard measure of the departure from independence, the estimated Kullback-Leibler divergence between a joint distribution and the product of its marginals. To do this, we use a consistent and rapidly converging en...
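The spacings estimate of entropy referred to here can be sketched with the classic m-spacing (Vasicek-style) estimator: sort the sample and average the log of the scaled m-spacings of the order statistics. The choice m ≈ √N below is a common heuristic, an assumption rather than the cited paper's exact estimator.

```python
import numpy as np

def spacings_entropy(x, m=None):
    """m-spacing (Vasicek-style) entropy estimator:
    H ~ (1/N) * sum_i log( N * (x_(i+m) - x_(i-m)) / (2m) ),
    with order-statistic indices clamped at the sample boundaries.
    Illustrative sketch; m = sqrt(N) is an assumed heuristic.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(round(np.sqrt(n))))
    i = np.arange(n)
    upper = x[np.minimum(i + m, n - 1)]
    lower = x[np.maximum(i - m, 0)]
    return np.mean(np.log(n * (upper - lower) / (2 * m)))
```

On a large standard-Gaussian sample the estimate should again be close to 0.5·log(2πe) ≈ 1.419.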
Publication date: 2004